
    Towards More Data-Aware Application Integration (extended version)

    Although most business application data is stored in relational databases, programming languages and wire formats in integration middleware systems are not table-centric. To avoid costly format conversions and data shipments, and to benefit from faster computation, the trend is to "push down" integration operations closer to the storage representation. We address the alternative case of defining declarative, table-centric integration semantics within standard integration systems. For that, we replace the current operator implementations of the well-known Enterprise Integration Patterns with equivalent "in-memory" table processing, and show a practical realization in a conventional integration system for a non-reliable, "data-intensive" messaging example. The results of the runtime analysis show that table-centric processing is promising already for standard, "single-record" message routing and transformations, and can potentially improve message throughput for "multi-record" table messages. Comment: 18 pages, extended version of the contribution to the British International Conference on Databases (BICOD), 2015, Edinburgh, Scotland
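    The table-centric reinterpretation of an Enterprise Integration Pattern can be sketched with a toy content-based router: a multi-record "table message" is routed by a relational selection over its rows rather than by per-message object inspection. All names and records below are illustrative, not the paper's implementation.

    ```python
    def route_table(message, predicate):
        """Content-based routing as a selection: split a multi-record
        table message (a list of row dicts) into matched and unmatched tables."""
        matched = [row for row in message if predicate(row)]
        unmatched = [row for row in message if not predicate(row)]
        return matched, unmatched

    # A "table message" of three order records (illustrative data).
    orders = [
        {"id": 1, "region": "EU", "amount": 120.0},
        {"id": 2, "region": "US", "amount": 80.0},
        {"id": 3, "region": "EU", "amount": 45.5},
    ]

    # Route EU orders to one channel, everything else to another.
    eu_orders, other_orders = route_table(orders, lambda r: r["region"] == "EU")
    ```

    The point of the sketch is that the routing condition is applied once to the whole table, which is what allows the "multi-record" throughput gains the abstract describes.
    
    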

    Does the magnetization transfer effect bias chemical exchange saturation transfer effects? Quantifying chemical exchange saturation transfer in the presence of magnetization transfer

    Purpose Chemical exchange saturation transfer (CEST) is an MRI technique sensitive to the presence of low-concentration solute protons exchanging with water. However, magnetization transfer (MT) effects also arise when large semisolid molecules interact with water, which biases CEST parameter estimates if quantitative models do not account for macromolecular effects. This study establishes under what conditions this bias is significant and demonstrates how using an appropriate model provides more accurate quantitative CEST measurements. Methods CEST and MT data were acquired in phantoms containing bovine serum albumin and agarose. Several quantitative CEST and MT models were used with the phantom data to demonstrate how underfitting can influence estimates of the CEST effect. CEST and MT data were acquired in healthy volunteers, and a two-pool model was fit in vivo and in vitro while increasing amounts of CEST data were removed, to show that biases in the CEST analysis also corrupt MT parameter estimates. Results When all significant CEST/MT effects were included, the derived parameter estimates for each CEST/MT pool significantly correlated (P < .05) with bovine serum albumin/agarose concentration; minimal or negative correlations were found with underfitted data. Additionally, a bootstrap analysis demonstrated that significant biases occur in MT parameter estimates (P < .001) when unmodeled CEST data are included in the analysis. Conclusions These results indicate that current practices of simultaneously fitting both CEST and MT effects in model-based analyses can lead to significant bias in all parameter estimates unless a sufficiently detailed model is utilized. Therefore, care must be taken when quantifying CEST and MT effects in vivo by properly modeling data to minimize these biases.

    Improving rainfall erosivity estimates using merged TRMM and gauge data

    Soil erosion is a global issue that threatens food security and causes environmental degradation. Management of water erosion requires accurate estimates of the spatial and temporal variations in the erosive power of rainfall (erosivity). Rainfall erosivity can be estimated from rain gauge stations and satellites. However, time-series rainfall data with high temporal resolution are often unavailable in many areas of the world. Satellite remote sensing provides continuous gridded estimates of rainfall, yet these are generally characterized by significant bias. Here we present a methodology that merges daily rain gauge measurements and the Tropical Rainfall Measuring Mission (TRMM) 3B42 data using collocated cokriging (ColCOK) to quantify the spatial distribution of rainfall and thereby to estimate rainfall erosivity across China. This study also used block kriging (BK) and TRMM alone to estimate rainfall and rainfall erosivity. The methodologies are evaluated against the individual rain gauge stations. The results generally indicate that the ColCOK technique, in combination with TRMM and gauge data, provides merged rainfall fields in good agreement with rain gauges and with the best accuracy in rainfall erosivity estimates, when compared with BK with gauge data and with TRMM alone.

    4D-PET reconstruction using a spline-residue model with spatial and temporal roughness penalties

    4D reconstruction of dynamic positron emission tomography (dPET) data can improve the signal-to-noise ratio in reconstructed image sequences by fitting smooth temporal functions to the voxel time-activity curves (TACs) during the reconstruction, though the optimal choice of function remains an open question. We propose a spline-residue model, which describes TACs as weighted sums of convolutions of the arterial input function with cubic B-spline basis functions. Convolution with the input function constrains the spline-residue model at early time points, potentially enhancing noise suppression in early time frames, while still allowing a wide range of TAC descriptions over the entire imaged time course, thus limiting bias.
    Spline-residue-based 4D reconstruction is compared to that of a conventional (non-4D) maximum a posteriori (MAP) algorithm, and to 4D reconstructions based on adaptive-knot cubic B-splines, the spectral model and an irreversible two-tissue compartment ('2C3K') model. 4D reconstructions were carried out using a nested-MAP algorithm including spatial and temporal roughness penalties. The algorithms were tested using Monte Carlo simulated scanner data, generated for a digital thoracic phantom with uptake kinetics based on a dynamic [18F]-fluoromisonidazole scan of a non-small cell lung cancer patient. For every algorithm, parametric maps were calculated by fitting each voxel TAC within a sub-region of the reconstructed images with the 2C3K model.
    Compared to conventional MAP reconstruction, spline-residue-based 4D reconstruction achieved >50% improvements for 5 of the 8 combinations of the 4 kinetic parameters for which parametric maps were created and the bias and noise measures used to analyse them, and produced better results for 5/8 combinations than any of the other reconstruction algorithms studied, while spectral-model-based 4D reconstruction produced the best results for 2/8. 2C3K-model-based 4D reconstruction generated the most biased parametric maps. Inclusion of a temporal roughness penalty function improved the performance of 4D reconstruction based on the cubic B-spline, spectral and spline-residue models.
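    The core of the spline-residue model, a TAC formed as a weighted sum of convolutions of the arterial input function with cubic B-spline basis functions, can be sketched numerically. The knot spacing, weights and toy input function below are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def cubic_bspline(x):
        """Uniform cubic B-spline kernel, nonzero for |x| < 2."""
        ax = np.abs(np.asarray(x, dtype=float))
        out = np.zeros_like(ax)
        m1 = ax < 1
        m2 = (ax >= 1) & (ax < 2)
        out[m1] = (4 - 6 * ax[m1] ** 2 + 3 * ax[m1] ** 3) / 6
        out[m2] = (2 - ax[m2]) ** 3 / 6
        return out

    def spline_residue_tac(t, aif, knots, weights, spacing):
        """TAC as a weighted sum of discrete convolutions of the arterial
        input function (AIF) with shifted cubic B-spline basis functions."""
        dt = t[1] - t[0]
        tac = np.zeros_like(t)
        for k, w in zip(knots, weights):
            basis = cubic_bspline((t - k) / spacing)
            tac += w * np.convolve(aif, basis)[: len(t)] * dt
        return tac

    t = np.arange(0.0, 300.0, 1.0)        # time grid in seconds (illustrative)
    aif = t * np.exp(-t / 30.0)           # toy arterial input function
    knots = [0.0, 60.0, 120.0, 180.0]     # illustrative knot positions
    weights = [0.05, 0.03, 0.02, 0.01]    # illustrative spline weights
    tac = spline_residue_tac(t, aif, knots, weights, spacing=10.0)
    ```

    Because every basis term is convolved with the AIF, the modelled TAC is forced to start from zero and rise no faster than the input function allows, which is the early-time constraint the abstract describes.
    
    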

    Dietary and supplemental long-chain omega-3 fatty acids as moderators of cognitive impairment and Alzheimer’s disease

    Purpose: There is an ever-growing body of literature examining the relationship between dietary omega-3 polyunsaturated fatty acids (ω3 PUFAs) and cerebral structure and function throughout life. In light of this, the use of ω3 PUFAs, namely, long-chain (LC) ω3 PUFAs (i.e., eicosapentaenoic acid and docosahexaenoic acid), as a therapeutic strategy to mitigate cognitive impairment and progression to Alzheimer’s disease is an attractive prospect. This review aims to summarise evidence reported by observational studies and clinical trials that investigated the role of LC ω3 PUFAs against cognitive impairment and future risk of Alzheimer’s disease. Methods: Studies were identified in PubMed and Scopus using the search terms “omega-3 fatty acids”, “Alzheimer’s disease” and “cognition”, along with common variants. Inclusion criteria included observational or randomised controlled trials (RCTs) with all participants aged ≥ 50 years that reported on the association between LC ω3 PUFAs and cognitive function or biological markers indicative of cognitive function linked to Alzheimer’s disease. Results: Evidence from 33 studies suggests that dietary and supplemental LC ω3 PUFAs have a protective effect against cognitive impairment. Synaptic plasticity, neuronal membrane fluidity, neuroinflammation, and changes in expression of genes linked to cognitive decline have been identified as potential targets of LC ω3 PUFAs. The protective effects of LC ω3 PUFAs on cognitive function and reduced risk of Alzheimer’s disease were supported by both observational studies and RCTs, with RCTs suggesting a more pronounced effect in individuals with early and mild cognitive impairment.
Conclusion: The findings of this review suggest that individuals consuming higher amounts of LC ω3 PUFAs are less likely to develop cognitive impairment and that, as a preventative strategy against Alzheimer’s disease, intake is most effective when dietary LC ω3 PUFAs are consumed prior to or in the early stages of cognitive decline.

    The visualisation of fingermarks on Pangolin scales using gelatine lifters

    Recent media reports document the plight of the Pangolin and its current position as “the most trafficked mammal in the world”. Pangolins are described by some as scaly anteaters, as all species are covered in hard keratinous tissue in the form of overlapping scales acting as a “flexible dermal armour”. It is estimated that between 2011 and 2013, 117,000–234,000 pangolins were slaughtered, but seizures may represent as little as 10% of the true volume of pangolins being illegally traded. In this paper, a method to visualise fingermarks on Pangolin scales using gelatine lifters is presented. Gelatine lifters provide an easy to use, inexpensive but effective method to help wildlife crime rangers across Africa and Asia disrupt the trafficking. The gelatine lifting process visualised marks producing clear ridge detail on 52% of the Pangolin scales examined, with a further 30% showing the impression of a finger with limited ridge detail. The paper builds on an initial sociotechnical approach to establishing requirements, then focuses on the methods and outcomes relating to lifting fingermarks off Pangolin scales using gelatine lifters, providing an evaluation of their use in practice.

    Partial Volume Correction in Arterial Spin Labeling Perfusion MRI: A method to disentangle anatomy from physiology or an analysis step too far?

    The mismatch between the spatial resolution of Arterial Spin Labeling (ASL) MRI perfusion images and the anatomy of functionally distinct tissues in the brain leads to a partial volume effect (PVE), which in turn confounds the estimation of perfusion in a specific tissue of interest such as grey or white matter. This confound occurs because the image voxels contain a mixture of tissues with disparate perfusion properties, leading to estimated perfusion values that reflect primarily the volume proportions of tissues in the voxel rather than the perfusion of any particular tissue of interest within that volume. It is already recognized that PVE influences studies of brain perfusion, and that its effect might be even more evident in studies where changes in perfusion are co-incident with alterations in brain structure, such as studies comparing an atrophic patient population vs control subjects, or studies comparing subjects over a wide range of ages. However, the application of PVE correction (PVEc) is currently limited and the employed methodologies remain inconsistent. In this article, we outline the influence of PVE in ASL measurements of perfusion, explain the main principles of PVEc, and provide a critique of the current state of the art for the use of such methods. Furthermore, we examine the current use of PVEc in perfusion studies and whether there is evidence to support its wider adoption. We conclude that there is sound theoretical motivation for the use of PVEc alongside conventional, 'uncorrected', images, and encourage such combined reporting. Methods for PVEc are now available within standard neuroimaging toolboxes, which makes our recommendation straightforward to implement. However, there is still more work to be done to establish the value of PVEc as well as the efficacy and robustness of existing PVEc methods.
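    The main principle behind one common family of PVEc methods, local linear regression, can be sketched as follows: each voxel's ASL signal is modelled as partial volume fractions times tissue-specific perfusion values, which are then recovered by least squares over a neighbourhood. This is a generic illustration of the regression idea, not a specific toolbox implementation; all numbers are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 25                                # voxels in a local regression kernel
    p_gm = rng.uniform(0.2, 0.9, size=n)  # grey-matter partial volume fractions
    p_wm = 1.0 - p_gm                     # white-matter fractions (toy: GM+WM=1)
    f_gm_true, f_wm_true = 60.0, 20.0     # "true" perfusion values (ml/100g/min)

    # Measured voxel signal: mixture of tissue perfusions plus noise.
    signal = p_gm * f_gm_true + p_wm * f_wm_true + rng.normal(0.0, 0.5, size=n)

    # Solve signal ≈ p_gm * f_gm + p_wm * f_wm for the tissue perfusions.
    A = np.column_stack([p_gm, p_wm])
    (f_gm, f_wm), *_ = np.linalg.lstsq(A, signal, rcond=None)
    ```

    The recovered f_gm and f_wm are tissue-specific estimates, whereas the raw voxel values mostly track the volume fractions, which is exactly the confound the abstract describes.
    
    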

    Confirming the existence of π-allyl-palladium intermediates during the reaction of meta photocycloadducts with palladium(ii) compounds

    The transient existence of π-allyl-palladium intermediates formed by the reaction of Pd(OAc)2 and anisole-derived meta photocycloadducts has been demonstrated using NMR techniques. The intermediates tended to be short-lived and underwent rapid reductive elimination of palladium metal to form allylic acetates; however, this degradation process could be delayed by changing the reaction solvent from acetonitrile to chloroform.

    A variational Bayesian method for inverse problems with impulsive noise

    We propose a novel numerical method for solving inverse problems subject to impulsive noise that possibly contains a large number of outliers. The approach is of Bayesian type, and it exploits a heavy-tailed t distribution for the data noise to achieve robustness with respect to outliers. A hierarchical model with all hyper-parameters automatically determined from the given data is described. An algorithm of variational type, which minimizes the Kullback-Leibler divergence between the true posterior distribution and a separable approximation, is developed. The numerical method is illustrated on several one- and two-dimensional linear and nonlinear inverse problems arising from heat conduction, including estimating boundary temperature, heat flux and heat transfer coefficient. The results show its robustness to outliers and the fast and steady convergence of the algorithm. Comment: 20 pages, to appear in J. Comput. Phys.
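    The robustness mechanism, a Student-t noise model treated as a Gaussian scale mixture so that each datum receives a precision weight that shrinks for outliers, can be illustrated with a small EM-style iteration on a toy linear inverse problem. This is a simplified sketch of the idea, not the paper's full hierarchical variational scheme; the degrees of freedom and noise scale are assumed known here, whereas the paper infers all hyper-parameters from the data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(100, 3))               # toy forward operator
    x_true = np.array([1.0, -2.0, 0.5])
    y = A @ x_true + 0.05 * rng.normal(size=100)
    y[::10] += 5.0                              # impulsive outliers on 10% of data

    nu, sigma2 = 3.0, 0.05 ** 2                 # t degrees of freedom, noise scale
    x_ls = np.linalg.lstsq(A, y, rcond=None)[0] # ordinary least squares (biased)
    x = x_ls.copy()
    for _ in range(20):
        r = y - A @ x
        w = (nu + 1) / (nu + r ** 2 / sigma2)   # E-step: expected noise precisions
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ y) # M-step: weighted least squares
    ```

    Outliers end up with residuals far above the noise scale, so their weights collapse towards zero and the reconstruction is driven by the clean data, while ordinary least squares remains biased by the outliers.
    
    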

    N-player quantum games in an EPR setting

    The N-player quantum game is analyzed in the context of an Einstein-Podolsky-Rosen (EPR) experiment. In this setting, a player's strategies are not unitary transformations as in alternate quantum game-theoretic frameworks, but a classical choice between two directions along which spin or polarization measurements are made. The players' strategies thus remain identical to their strategies in the mixed-strategy version of the classical game. In the EPR setting the quantum game reduces to the corresponding classical game when the shared quantum state reaches zero entanglement. We find the relations for the probability distribution for N-qubit GHZ and W-type states, subject to general measurement directions, from which the expressions for the mixed Nash equilibrium and the payoffs are determined. Players' payoffs are then defined with linear functions so that common two-player games can be easily extended to the N-player case and permit analytic expressions for the Nash equilibrium. As a specific example, we solve the Prisoners' Dilemma game for general N ≥ 2. We find a new property of the game: for an even number of players the payoffs at the Nash equilibrium are equal, whereas for an odd number of players the cooperating players receive higher payoffs. Comment: 26 pages, 2 figures
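    The joint outcome probabilities for measurement directions on an N-qubit GHZ state, which the paper derives analytically, can be checked numerically by a direct state-vector computation. The sketch below restricts the measurement directions to azimuthal angles in the x-y plane, a simplifying assumption rather than the paper's general setting.

    ```python
    import numpy as np
    from functools import reduce

    def ghz(n):
        """N-qubit GHZ state (|0...0> + |1...1>)/sqrt(2) as a state vector."""
        psi = np.zeros(2 ** n, dtype=complex)
        psi[0] = psi[-1] = 1 / np.sqrt(2)
        return psi

    def outcome_probability(psi, angles, outcomes):
        """Joint probability of spin outcomes (+1/-1) when qubit i is measured
        along the direction cos(phi_i) X + sin(phi_i) Y in the x-y plane."""
        projectors = []
        for phi, s in zip(angles, outcomes):
            # Eigenvector of cos(phi) X + sin(phi) Y with eigenvalue s.
            v = np.array([1.0, s * np.exp(1j * phi)]) / np.sqrt(2)
            projectors.append(np.outer(v, v.conj()))
        P = reduce(np.kron, projectors)       # joint projector on all qubits
        return float(np.real(psi.conj() @ P @ psi))

    # Three players all measuring along X (phi = 0): the all-spin-up outcome.
    p_up3 = outcome_probability(ghz(3), [0.0, 0.0, 0.0], [1, 1, 1])
    ```

    For the GHZ state with all-X measurements, only outcomes whose spin product is +1 occur, each with probability 1/4, so p_up3 evaluates to 0.25.
    
    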